Value of Verification Fits Survival Profile

By Harry Foster
Integrated System Design

March 1, 2002 (12:34 p.m. EST)

Achieving functional closure, or ensuring that designs meet their functional specification from RTL to final layout, continues to be one of the greatest challenges for today's ASIC and system-on-chip design teams. One facet of that challenge is shortening the verification cycle while improving the quality of the functional validation process-and that requires new verification techniques.

Yet, when discussing the option of augmenting a traditional simulation methodology with formal verification, the engineer is often left glassy-eyed with bewilderment. To demystify formal verification, I have created a succinct and practical survival guide, an aid to understanding the value of applying formal techniques to the traditional verification flow.

My goal is to build the necessary foundation and provide a practical outline for creating a synergistic simulation and formal verification flow-without delving under the hood of formal tools.

Functional verification, in general, is a process of demonstrating that a model of the design is functioning exactly as the engineer expects. Ultimately, the only thing that matters during the process of functional verification is high coverage-that is, ideally we would like to explore all combinations of input values with respect to all possible sequences of internal state. Without high coverage, corner cases go unexplored, resulting in functional bugs within the silicon.

Traditionally, vector-based verification techniques (such as simulation, acceleration and emulation) have been the primary process used for design validation, coupled with coverage techniques to expose unverified portions of the design (for example, functional coverage, state-machine arc coverage, line coverage, expression coverage and toggle coverage).

But vector-based functional verification techniques are inherently incomplete for all but the smallest designs. To illustrate that point, let's verify the functionality of a simple 32-bit comparator, as shown in Fig. 1.

Exhaustively verifying such a comparator using simulation requires 2^64 vectors. Assuming our simulator was able to evaluate one vector every microsecond (which is incredibly fast), it would take 584,942 years to complete the simulation run. In general, verification engineers apply techniques to reduce the vector set (for example, removing redundancy and selecting targeted corner cases). That makes vector-based verification tractable. Yet, as demonstrated with our simple 32-bit comparator, it remains inherently incomplete.
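To spell out the arithmetic: two 32-bit inputs give 2^32 x 2^32 = 2^64 input combinations, and 2^64 microseconds comes to about 1.8 x 10^19 microseconds, or roughly 584,942 years of nonstop simulation.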

Formal verification, by contrast, is a systematic process of ensuring-via the application of exhaustive algorithmic techniques-that a design implementation satisfies the properties of its specification. The key words here are systematic, properties, specification and exhaustive.

In an informal, vector-based verification process, the engineer interprets a written specification and then transforms it into a testbench, which drives simulation vector stimuli. The engineer then observes and validates responses from the design model. Many errors may be introduced into the testbench (and the design) because of the ambiguous interpretation of the written specification. In a formal verification process, the engineer creates a formal specification, which unambiguously describes properties of the design in a mathematical form. The formal engine then algorithmically, and exhaustively, proves that the design implementation complies with the formally specified property.

Returning to our simple 32-bit comparator example, the engineer might specify that each output port's behavior is valid with respect to its input-and that the <, = and > output ports are always mutually exclusive. Using the Accellera Open Verification Library (see www.verificationlib.org) as a form of specification, the engineer codes these properties in Verilog as follows:

assert_implication less (ck, reset_n, (a<b), (c_lss==1'b1));
assert_implication equal (ck, reset_n, (a==b), (c_equ==1'b1));
assert_implication greater (ck, reset_n, (a>b), (c_grt==1'b1));
assert_one_hot #(0,3) mutex (ck, reset_n, {c_lss, c_equ, c_grt});
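For context, here is a minimal sketch (module and port names are hypothetical) of the comparator these assertions would be bound to:

module cmp32 (ck, reset_n, a, b, c_lss, c_equ, c_grt);
  input ck, reset_n;
  input [31:0] a, b;
  output c_lss, c_equ, c_grt;

  // implementation under verification
  assign c_lss = (a < b);
  assign c_equ = (a == b);
  assign c_grt = (a > b);

  // the four OVL assertions above would be instantiated here
endmodule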

The OVL property-for example, (a>b) → (c_grt==1'b1)-is part of an unambiguous specification whose semantics are interpreted in a mathematical form that a formal engine exhaustively proves in a matter of seconds-as opposed to hundreds of thousands of years using simulation.

So what's wrong with this picture? Why hasn't vector-based functional verification been replaced by formal property checking?

The answer is scalability. Simulation-based methodologies, though inherently incomplete, scale to very large designs. Since exhaustive simulation is impractical, engineers intuitively understand that they must partition a very large design into a set of smaller blocks in order to increase the thoroughness of their verification.

Formal verification is no different. By applying exhaustive formal property checking at the block level during development and then integrating the formally verified blocks into a larger system simulation environment, we are able to achieve our goal of high verification coverage. Further, a synergistic verification methodology is constructed by combining formal and traditional verification techniques. For instance, properties specified for block-level formal verification are reused as monitors during system simulation. That improves the overall quality of the verification process by increasing observability (also referred to as white-box testing).

A key component of formal methods is the systematic approach to design through formal specification, as opposed to an ad-hoc design methodology. Another benefit of the systematic specification approach is that design errors often are initially uncovered (or prevented) through the reasoning process of expressing various design properties-without any form of verification.

The OVL is just one of many forms of specifying design assertions (that is, properties).

Currently, Accellera (www.accellera.org)-whose mission is to drive worldwide development and use of standards that enhance a language-based design automation process-is standardizing a formal property language through the efforts of its Formal Verification Committee (www.eda.org/vfv). Two of the formal property languages under consideration for standardization are Motorola's Cycle-Based Verilog (CBV) and IBM's Sugar.

These powerful and expressive formal property languages will enable engineers to specify properties and constraints for formal analysis (for example, property checking), specify functional coverage models to measure the quality of simulation, and develop pseudorandom constraint-driven simulation environments derived from formal specifications. But it may be some time before commercial simulators support the languages.

Classes of properties
In the comparator example, I demonstrated an invariant or safety property, which specifies behavior that must hold for all time. Another class of property, referred to as liveness, specifies temporal behavior. While a safety property is used to describe something bad that should never happen (for example, c_grt is never true when a < b), a liveness property describes something good that must eventually happen.

In a vector-based verification methodology, the testbench is responsible for modeling the design environment by generating only legal input stimuli to the design model. In formal verification, instead of creating a testbench, the user specifies the legal input behavior through a set of properties referred to as constraints. The set of constraints provides an abstraction of the environment and limits the formal engine's search to the legal state space. Constraints specify both the design environment's invariant behavior (for example, a memory's read address cannot equal its write address during a write cycle) and its temporal behavior (for example, when signal c_enable rises, then on the next clock after the second data transfer, the signal c_pending must rise).
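As an illustration (signal names are hypothetical), the memory constraint above could be written as an OVL monitor, which the formal engine treats as an assumption about the environment and simulation treats as an interface checker:

// constraint: during a write cycle, the read address must not
// equal the write address
assert_never rd_wr_conflict (ck, reset_n, (write_en && (rd_addr == wr_addr)));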

Specifying constraints on block-level interfaces is a lot of work and should not be underestimated. However, formally defining block-level interfaces minimizes misunderstandings of block behavior between designers and facilitates block reuse. Furthermore, the set of constraints can be leveraged in simulation as block-level interface monitors, thereby considerably reducing debug time by improving observability. Thus, the time spent on defining block-level constraints is more than recovered by reducing the time spent in debug.

Synergistic flow
Fig. 2 illustrates a synergistic simulation and formal verification flow-a survival guide that enables you to achieve your goal of shortening the verification cycle while improving the quality of the functional validation process. The key formal components in this flow are property checking, equivalence checking, semantic consistency checking, structural consistency checking and layout verification.

Formal property checking begins the synergistic flow by formally specifying block (or super-block) level interfaces-before RTL coding. I refer to these formal block-level interface specifications as verifiable contracts among multiple blocks and engineers. The formal interface specifications serve as constraint models during block-level property checking as well as interface monitors during simulation. Further, through this process, you resolve interface misconceptions with other engineers before RTL coding.

As you begin coding the RTL, you would add assertions (that is, properties) for any potential corner case concerns using the OVL or one of the formal property languages. For example, you might add assertions that specify that a queue or FIFO will never overflow or underflow, or that a specified expression (condition) will never occur. After you complete the block-level set of constraints, RTL model and set of assertions, you would begin functional verification. The RTL assertions would become a set of verification targets for the formal property checker.
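For instance (signal names are hypothetical), a pair of OVL assertions guarding a FIFO against overflow and underflow might look like this:

// the FIFO must never be pushed when full nor popped when empty
assert_never fifo_overflow (ck, reset_n, (push && full));
assert_never fifo_underflow (ck, reset_n, (pop && empty));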

Semiformal bug-hunting techniques, such as nonexhaustive limited cycle or bounded model checking, are applied to properties too complex for full formal analysis using property checking. In addition, your constraint model could generate constraint-driven pseudorandom stimulus into the design model for block-level simulation. This is also useful for bug-hunting properties that are too complex for the formal engine. If you find a bug during system simulation for which no assertion was specified, then you would augment your RTL model with a new assertion to capture the failing corner case. That would provide a monitor for the revised RTL design and would formally document the corner case, which would benefit design reuse.

Next is formal equivalence checking. When you complete block-level verification, and system simulation begins, the RTL design begins the journey through the synthesis process and physical flow. It is critical that you ensure equivalence is maintained during these various design transformations; otherwise, all your RTL functional verification efforts will have been for naught.

Historically, proving equality between the final netlist and the original RTL model has been problematic. One approach to overcome that challenge involved abandoning the RTL model in favor of the gate-level model for functional verification. An alternative approach was to run the design's regression suite on both the RTL and gate-level models while comparing the output results. Obviously, using simulation to prove equivalence is incomplete. Furthermore, both approaches hinder your project's time-to-market goals by creating a simulation bottleneck.

In our synergistic verification flow, gate-level simulation is replaced with formal equivalence checking. That dramatically reduces verification time by minimizing dependency on gate-level simulation-and providing exhaustive coverage of equality.

Semantic consistency checking
To prove equality between the RTL and its downstream transformations, you must validate two components: logical consistency and semantic consistency. Formal equivalence checking enables you to exhaustively establish logical consistency.

Before synthesis, however, you must eliminate what my co-author Lionel Bening and I refer to as the "bad stuff" in our book Principles of Verifiable RTL Design. Essentially, this is any RTL modeling that yields presynthesis vs. post-synthesis simulation mismatches.

The following Verilog code illustrates a potential semantic consistency problem when a full_case synthesis pragma is used:

module mux (a, b, s, q);
  output q;
  input a, b;
  input [1:0] s;
  reg q;

  always @(a or b or s) begin
    case (s) //rtl_synthesis full_case
      2'b01: q = a;
      2'b10: q = b;
      // no default arm: presynthesis simulation holds q (latch-like behavior)
    endcase
  end
endmodule

Pre- vs. post-synthesis semantic inconsistency occurs if an error in the logic driving the "s" variable generates a value other than the alternatives 2'b01 and 2'b10. In presynthesis simulation, if "s" assumes an illegal value, then "q" behaves as a latch, holding its previous valid simulation value. However, the post-synthesis model contains no latch. That results in a potential pre- vs. post-synthesis simulation difference, which means that you might miss a functional bug during RTL simulation that you could uncover during gate-level simulation-for logically equivalent circuits. Of course, the goal is to eliminate the gate-level simulation bottleneck while ensuring exhaustive equality coverage.

Note that formal equivalence checking of the pre- vs. post-synthesis models does not help you in this case, since the two designs are logically consistent with the full_case pragma in the RTL (that is, their Boolean expressions are logically equivalent when the full_case "don't cares" are considered). Today's formal equivalence checking tools permit you to disable the synthesis pragma during the proof, which can aid in identifying a few semantic inconsistencies.

However, this is only a partial solution that burdens you with debugging many false negatives. A complete formal solution requires full sequential analysis of the RTL to determine the safety of synthesis pragma usage, X assignments, variable index range overflow and other conditions that might lead to semantic inconsistencies. This "don't care" class of properties can be automatically extracted by a formal semantic consistency checker and then exhaustively verified.
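As a sketch of one such repair (a coding-style choice, not the only option), the mux above can be made semantically consistent by dropping the pragma and supplying a deterministic default, so that the presynthesis and post-synthesis models agree for every value of "s":

always @(a or b or s) begin
  case (s)
    2'b01: q = a;
    2'b10: q = b;
    default: q = 1'b0; // deterministic default: no inferred latch, no X assignment
  endcase
end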

As your design completes the progression through the physical flow, structural consistency checking (using formal verification) greatly reduces your dependency on gate-level simulation by formally verifying design structures (for example, tristate bus contention, clock-domain crossing errors and test logic errors).
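For example (enable names are hypothetical), a tristate-bus contention check for a four-driver bus can be expressed as an OVL assertion stating that at most one driver enable is ever active:

// at most one of the four bus-driver enables may be asserted at a time
assert_zero_one_hot #(0,4) bus_contention (ck, reset_n, {en3, en2, en1, en0});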

You have spent a tremendous amount of time and resources functionally verifying your design. But there are still potential errors in the layout (for example, illegal use of library cells, errors caused by ECOs and netlist port inconsistencies) that you must catch.

To achieve functional closure, you must close the loop entirely, back to the original RTL. Traditionally, engineers have depended on a technique of generating a Spice netlist from the layout and comparing it with the Spice netlist for the schematic (layout vs. schematic comparison). For a faster (and more complete) solution, compare the Spice netlist directly back to the gate-level netlist using formal layout vs. RTL equivalence checking.

Formal verification has grown beyond the bounds of research and can be successfully integrated into your design flow, helping you achieve your goals of shortening the verification cycle while improving the quality of your functional validation process.

---
Harry Foster, Verplex Systems Inc. chief architect (Milpitas, Calif.), chairs the Accellera Formal Verification Committee, which is set to define a language standard for property specification. He co-authored Principles of Verifiable RTL Design, Kluwer Academic Pubs, June 2001.

http://www.isdmag.com
Copyright © 2002 CMP Media LLC
3/1/02, Issue # 14153, page 30.